Agenda

Text Processing

See Python file

Untitled2
In [1]:
import pandas as pd
import numpy as np
import urllib.request
import re
from textblob import TextBlob
# lib.py provides the helpers used below: nextbigchunk, getstopwords,
# getdialogue, getsentiment, maverage, alllower
%run lib.py
In [2]:
# Pick one script; names are URL-encoded as they appear on dailyscript.com.
#name="Legally%20Blonde"
#name="aboutmary"
#name="10Things"
name="magnolia"
#name="Friday%20The%2013th"
#name="Ghost%20Ship"
#name="Juno"
#name="Reservoir+Dogs"
#name="shawshank"
#name="Sixth%20Sense,%20The"
#name="sunset_bld_3_21_49"
#name="Titanic"
#name="toy_story"
#name="trainspotting"
#name="transformers"
#name="the-truman-show_shooting"
#name="batman_production"
In [3]:
# Most scripts on dailyscript.com are .html; these few are plain .txt.
ext="html"
txtfiles=["Ghost%20Ship", "Legally%20Blonde", "Friday%20The%2013th", "Juno", "Reservoir+Dogs", "Sixth%20Sense,%20The", "Titanic"]
if name in txtfiles:
    ext="txt"
fp = urllib.request.urlopen("http://www.dailyscript.com/scripts/"+name+"."+ext)
mybytes = fp.read()

mystr = mybytes.decode("utf8", "ignore")
fp.close()
liston=mystr.split("\n")
liston=[s.replace('\r', '') for s in liston]           # drop Windows carriage returns
liston=[re.sub('<[^<]+?>', '', text) for text in liston]  # strip simple HTML tags
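The regex `<[^<]+?>` above strips simple (non-nested) HTML tags line by line. A minimal standalone check of its behavior, independent of the fetched script:

```python
import re

TAG = re.compile(r'<[^<]+?>')  # same pattern as above: non-greedy, no nested '<'

def strip_tags(line: str) -> str:
    """Remove simple HTML tags from one line of a script."""
    return TAG.sub('', line)

print(strip_tags('<b>INT. BAR -- NIGHT</b>'))  # INT. BAR -- NIGHT
print(strip_tags('no tags here'))              # no tags here
```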
In [4]:
if name=="shawshank":
    # This script indents with tabs; normalize to spaces so prefix matching works.
    liston=[i.replace("\t", "    ") for i in liston]
In [5]:
char=""
script=[]
# The indentation prefixes (character intro, dialogue, new scene) are detected
# automatically by nextbigchunk from lib.py, starting the scan at line 45.
charintro=''
endofdialogue=''
dialoguepre=''
newscenepre=''
i=45
print("Characters")
i, charintro=nextbigchunk(liston, i)
print("Adverbs")
i, adverb=nextbigchunk(liston, i, adverbs=True)
print("Dialogues")
i, dialoguepre=nextbigchunk(liston, i)
print("New Scene:")
i, newscenepre=nextbigchunk(liston, i)

# If detection failed ("X"), retry further into the file, with a manual
# fallback for aboutmary's unusual layout.
if newscenepre=="X":
    i=100
    i, newscenepre=nextbigchunk(liston, i)
    if name=="aboutmary":
        newscenepre=" ".join(["" for i in range(56)])  # 55-space prefix
    if len(newscenepre)==len(charintro):
        newscenepre="X"

endofdialogue=newscenepre
    

scene=1
for s in liston:
    # A character cue: text exactly at the character-intro indent, not a parenthetical.
    if s[0:len(charintro)]==charintro and s[len(charintro)]!=" " and s.strip()[0]!="(" and s.strip()[len(s.strip())-1]!=")":
        char=s[len(charintro):]
        new=dict()
        new['char']=char.strip()
        new['dialogue']=""
        new['scene']=scene
        new['adverb']=""
    # A blank line (or the end-of-dialogue prefix) closes the current speech.
    if s==endofdialogue or s.replace(" ", "")=="":
        if char!="":
            char=""
            script.append(new)
    # Dialogue lines accumulate onto the open speech.
    if char!="" and s[0:len(dialoguepre)]==dialoguepre and s[len(dialoguepre)]!=" ":
        if new['dialogue']!="":
            new['dialogue']=new['dialogue']+" "
        new['dialogue']=new['dialogue']+s[len(dialoguepre):]
    # Parentheticals like "(to Stanley)" become the speech's adverb.
    if char!="" and ((s[0:len(adverb)]==adverb and s[len(adverb)]!=" ") or (len(s)>1 and s.strip()[0]=="(" and s.strip()[len(s.strip())-1]==")" )):
        if new['adverb']!="":
            new['adverb']=new['adverb']+" "
        new['adverb']=new['adverb']+s[len(adverb):]
    # An all-caps line at the scene indent starts a new scene.
    if s[0:len(newscenepre)]==newscenepre and len(s)>len(newscenepre) and ( s.isupper()) and s[len(newscenepre)]!=" ":
        scene=scene+1
Characters
                                magnolia
                                NARRATOR
                                NARRATOR
                                NARRATOR
                                NARRATOR
                                NARRATOR
Adverbs
Dialogues
                      In the New York Herald, November 26,
                      year 1911, there is an account of the
                      hanging of three men --
                      ...they died for the murder of
                      Sir Edmund William Godfrey --
                      -- Husband, Father, Pharmacist and all
New Scene:
     a P.T. Anderson picture                             11/10/98
     a Joanne Sellar/Ghoulardi Film Company production
     
     
     
     
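The cell above classifies each line by its leading-whitespace prefix, with the prefixes themselves detected by `nextbigchunk` from `lib.py`. A minimal, self-contained sketch of the same prefix idea (the indentation widths here are made up for illustration; real scripts vary):

```python
# Hypothetical indentation widths; in the notebook these are auto-detected.
CHARINTRO = " " * 20   # character names are deeply indented
DIALOGUE  = " " * 10   # dialogue is indented less

def classify(line: str) -> str:
    """Classify a script line by its leading-whitespace prefix."""
    if line.startswith(CHARINTRO) and line[len(CHARINTRO):len(CHARINTRO)+1] not in ("", " "):
        return "character"
    if line.startswith(DIALOGUE) and line[len(DIALOGUE):len(DIALOGUE)+1] not in ("", " "):
        return "dialogue"
    return "other"

print(classify(" " * 20 + "NARRATOR"))                    # character
print(classify(" " * 10 + "In the New York Herald..."))   # dialogue
print(classify("INT. BAR -- NIGHT"))                      # other
```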
In [6]:
pd.DataFrame(script).to_csv(name+'.csv', index=None)
pd.DataFrame(script)
Out[6]:
adverb char dialogue scene
0 magnolia 1
1 NARRATOR In the New York Herald, November 26, year 1911... 2
2 NARRATOR ...they died for the murder of Sir Edmund Will... 2
3 NARRATOR -- Husband, Father, Pharmacist and all around ... 2
4 NARRATOR Greenberry Hill, London. Population as listed. 3
5 NARRATOR He was murdered by three vagrants whose motive... 5
6 NARRATOR ...Joseph Green..... 5
7 NARRATOR ...Stanley Berry.... 5
8 NARRATOR ...and Nigel Hill... 5
9 NARRATOR Green, Berry and Hill. 7
10 NARRATOR ...And I Would Like To Think This Was Only A M... 7
11 NARRATOR As reported in the Reno Gazzette, June of 1983... 9
12 NARRATOR --- the water that it took to contain the fire -- 10
13 NARRATOR -- and a scuba diver named Delmer Darion. 12
14 NARRATOR Employee of the Peppermill Hotel and Casino, R... 15
15 NARRATOR -- well liked and well regarded as a physical,... 16
16 NARRATOR -- as reported by the coroner, Delmer died of ... 21
17 NARRATOR ...volunteer firefighter, estranged father of ... 24
18 NARRATOR -- added to this, Mr. Hansen's tortured life m... 26
19 CRAIG HANSEN ...oh God...fuck...I'm sorry...I'm sorry... 27
20 NARRATOR The weight of the guilt and the measure of coi... 27
21 CRAIG HANSEN ...forgive me... 27
22 NARRATOR And I Am Trying To Think This Was All Only A M... 29
23 NARRATOR The tale told at a 1961 awards dinner for the ... 32
24 NARRATOR Seventeen year old Sydney Barringer. In the ci... 33
25 NARRATOR The coroner ruled that the unsuccessful suicid... 33
26 NARRATOR The suicide was confirmed by a note, left in t... 34
27 NARRATOR At the same time young Sydney stood on the le... 35
28 NARRATOR The neighbors heard, as they usually did, the... 36
29 NARRATOR -- and it was not uncommon for them to threat... 37
... ... ... ... ...
1493 DIXON We gotta get his money so we can get outta her... 382
1494 WORM That idea is over now. We're not gonna do tha... 382
1495 (to Stanley) DIXON DADDY, FUCK, DADDY, DON'T GET MAD AT ME. DON'T... 382
1496 WORM I'm not mad, son, I will not be mad at you an... 382
1497 DIXON DAD. 382
1498 DIXON I - just - thought - that - I - didn't want - ... 382
1499 WORM It's ok, boy. 382
1500 MUSIC/KERMIT THE FROG "It's not that easy bein' green... Having to s... 383
1501 DONNIE My teeff...my teeef.... 385
1502 JIM KURRING YOU'RE OK...you're gonna be ok.... 385
1503 NARRATOR And there is the account of the hanging of thr... 390
1504 NARRATOR There are stories of coincidence and chance an... 391
1505 NARRATOR ...and we generally say, "Well if that was in... 392
1506 DOCTOR Are you with us? Linda? Is it Linda? 394
1507 NARRATOR Someone's so and so meet someone else's so and... 395
1508 NARRATOR And it is in the humble opinion of this narrat... 398
1509 STANLEY Dad...Dad. 399
1510 STANLEY You have to be nicer to me, Dad. 399
1511 RICK Go to bed. 399
1512 STANLEY I think that you have to be nicer to me. 399
1513 RICK Go to bed. 399
1514 NARRATOR ...and so it goes and so it goes and the book... 400
1515 MARCIE I killed him. I killed my husband. He hit my... 401
1516 DONNIE I know that I did a thtupid thing. Tho-thtupid... 402
1517 DONNIE I really do hath love to give, I juth don't kn... 402
1518 JIM KURRING ...these security systems can be a real joke. ... 403
1519 DONNIE ....ohh-thur-I-thur-thill.... 403
1520 JIM KURRING You guys make alotta money, huh? 403
1521 (beat) JIM KURRING ...alot of people think this is just a job tha... 405
1522 END. 406

1523 rows × 4 columns

In [7]:
magnolia=pd.read_csv(name+'.csv')
stopwords = getstopwords()
In [8]:
# Normalize character names that vary across cues.
removedchars=["'S VOICE", "'S WHISPER VOICE", " GATOR"]
for s in removedchars:
    magnolia['char']=magnolia['char'].apply(lambda x: x.replace(s, ""))
# Build a scene -> unique-character-list map.
scenes=dict()
for s in magnolia.iterrows():
    scenes[s[1]['scene']]=[]
for s in magnolia.iterrows():
    scenes[s[1]['scene']].append(s[1]['char'])
for s in magnolia.iterrows():
    scenes[s[1]['scene']]=list(set(scenes[s[1]['scene']]))
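The three `iterrows` passes above build a scene-to-unique-characters map; the same result can be had in a single pandas groupby (a sketch on toy data, with the `char`/`scene` columns assumed from the CSV; `sorted` is used instead of the original's unordered `set`):

```python
import pandas as pd

df = pd.DataFrame({
    "scene": [1, 1, 2, 2, 2],
    "char":  ["NARRATOR", "NARRATOR", "FRANK", "PHIL", "FRANK"],
})
# One pass instead of three iterrows loops: unique characters per scene.
scenes = df.groupby("scene")["char"].apply(lambda s: sorted(set(s))).to_dict()
print(scenes)  # {1: ['NARRATOR'], 2: ['FRANK', 'PHIL']}
```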
In [9]:
characters=[]
for s in scenes:
    for k in scenes[s]:
        characters.append(k)
characters=list(set(characters))
appearances=dict()
for s in characters:
    appearances[s]=0
for s in magnolia.iterrows():
    appearances[s[1]['char']]=appearances[s[1]['char']]+1
In [10]:
# Broadcasting the dict over a range index repeats each count on every row;
# only row 0 is used below to rank characters.
a=pd.DataFrame(appearances, index=[i for i in range(len(appearances))])
In [11]:
finalcharacters=[]
for s in pd.DataFrame(a.transpose()[0].sort_values(0, ascending=False))[0:10].iterrows():
    finalcharacters.append(s[0])
In [12]:
finalcharacters
# Write a ';'-separated co-appearance matrix over the top-10 characters,
# weighting each shared scene by its number of dialogue lines.
file=open(name+"_nodes.csv", "w")
couplesappearances=dict()
for s in finalcharacters:
    file.write(";")
    file.write(s)
file.write("\n")
for s in finalcharacters:
    newlist=[]
    for f in finalcharacters:
        newlist.append(0)
        couplesappearances[f+"_"+s]=0
    j=0
    for f in finalcharacters:
        for p in scenes:
            if f in scenes[p] and s in scenes[p] and f!=s and finalcharacters.index(f)<finalcharacters.index(s): 
                long=len(magnolia[magnolia["scene"]==p])
                newlist[j]=newlist[j]+long
                couplesappearances[f+"_"+s]=couplesappearances[f+"_"+s]+long
        j=j+1
    file.write(s)
    for f in newlist:
        file.write(";")
        file.write(str(f))
    file.write("\n")
file.close()
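The nested loops above count, for each pair of top characters, the dialogue lines in scenes they share. A compact equivalent with `itertools.combinations` (a sketch on toy data standing in for the notebook's `scenes` map and per-scene dialogue counts):

```python
from itertools import combinations
from collections import Counter

# Toy stand-ins for the notebook's `scenes` and per-scene dialogue counts.
scenes = {1: ["FRANK", "PHIL"], 2: ["FRANK", "PHIL", "EARL"], 3: ["EARL"]}
scene_len = {1: 4, 2: 2, 3: 5}

couples = Counter()
for p, chars in scenes.items():
    for a, b in combinations(sorted(set(chars)), 2):
        couples[f"{a}_{b}"] += scene_len[p]  # weight shared scene by its dialogue count

print(couples.most_common())
# [('FRANK_PHIL', 6), ('EARL_FRANK', 2), ('EARL_PHIL', 2)]
```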
In [13]:
a=pd.DataFrame(couplesappearances, index=[i for i in range(len(couplesappearances))])
finalcouples=[]
for s in pd.DataFrame(a.transpose()[0].sort_values(0, ascending=False))[0:4].iterrows():
    finalcouples.append(s[0])
In [14]:
file=open(name+"_finalcharacters.csv", "w")
for s in finalcharacters:
    file.write(s+"\n")
file.close()
file=open(name+"_finalcouples.csv", "w")
for s in finalcouples:
    file.write(s+"\n")
file.close()
In [15]:
importantchars=[]   # characters with more than 10 dialogue lines
for char in appearances:
    if appearances[char]>10:
        importantchars.append(char)
In [16]:
file=open(name+"_sentiment_overtime_individual.csv", "w")
file2=open(name+"_sentiment_overtime_individualminsmaxs.csv", "w")

%matplotlib inline
import matplotlib.pyplot as plt

for k in finalcharacters:
    print(k)
    dd=getdialogue(magnolia, k, k, scenes)
    dd=[str(d) for d in dd]
    polarities, subjectivities=getsentiment(dd)
    moveda=maverage(polarities, dd, .99)  # heavily smoothed polarity curve
    plt.plot(moveda)
    i=0
    for s in moveda:
        file.write(k+","+str(float(i)/len(moveda))+", "+str(s)+"\n")
        i=i+1
    plt.ylabel('polarities')
    plt.show()
    file2.write(k+"| MIN| "+dd[moveda.index(np.min(moveda))]+"\n")
    file2.write(k+"| MAX| "+dd[moveda.index(np.max(moveda))]+"\n")
    print("MIN: "+dd[moveda.index(np.min(moveda))])
    print("\n")
    print("MAX: "+dd[moveda.index(np.max(moveda))])
    
file.close()
file2.close()

file=open(name+"_sentiment_overtime_couples.csv", "w")
file2=open(name+"_sentiment_overtime_couplesminsmaxs.csv", "w")

for k in finalcouples:
    print(k)
    liston=k.split("_")
    dd=getdialogue(magnolia, liston[0], liston[1], scenes)
    dd=[str(d) for d in dd]
    polarities, subjectivities=getsentiment(dd)
    moveda=maverage(polarities, dd, .99)  # same smoothing as for individuals
    plt.plot(moveda)
    i=0
    for s in moveda:
        file.write(k+","+str(float(i)/len(moveda))+", "+str(s)+"\n")
        i=i+1
    plt.ylabel('polarities')
    plt.show()
    file2.write(k+"| MIN| "+dd[moveda.index(np.min(moveda))]+"\n")
    file2.write(k+"| MAX| "+dd[moveda.index(np.max(moveda))]+"\n")
    print("MIN: "+dd[moveda.index(np.min(moveda))])
    print("\n")
    print("MAX: "+dd[moveda.index(np.max(moveda))])
    
file.close()
file2.close()
JIM KURRING
MIN: You mind if I check things back here? 


MAX: YOU'RE OK...you're gonna be ok....
JIMMY
MIN: She went crazy.  She went crazy, Rose. 


MAX: Imagine you are attending a jam session of classical composers and they have  each done an arrangment of the classic  favorite, "Whispering."  Here are three  variations on the theme, as three classic  composer's might have written it -- you are to name the composer.  The First: 
CLAUDIA
MIN: I'm sorry. 


MAX: Did you ever go out with someone and just....lie....question after question, maybe you're trying to  make yourself look cool or better  than you are or whatever, or smarter  or cooler and you just -- not really lie, but maybe you just don't say everything --
FRANK
MIN: If you feel, made to feel like you need them, like -- like you can't live if you're without them or you need, what?  They're pussy?  They're love? Fuck that.  Self Sufficient, gents.  That's the truth. What you are -- we are -- you need them  for what?  To fucking make you a piece of snot rag?  A puppett?  huh?  Hear them bitch and moan? bitch and moan --  and we're taught one thing -- go the other way -- there is No Excuse I will give you, I'm not gonna apologize -- I'm not gonna  apologize for my NEED my DESIRE...my, the  things that I need as a man to feel comfortable... You understand?  You understand?  You need to say something, "my mommy hit me or  daddy hit me or didn't let me play soccer,  so now I make mistakes, cause a that -- something, so now I piss and shit on it and do this." Bullshit.  I'm sorry. ok. yeah. no. fuck.  go.  fuck. alright. go make a new mistake. maybe not, I dunno...fuck.... 


MAX: I wouldn't want that to be misunderstood: My enrollment was totally unoffical because I was, sadly, unable to afford tuition up  there.  But there were three wonderful men who were kind enough to let me sit in on their classes, and they're names are:  Macready, Horn and Langtree among others. I was completely independent financially, and like I said: One Sad Sack A Shit.  So what we're looking at here is a true rags to riches story and I think that's  what most people respond to in "Seduce," And At The End Of The Day? Hey -- it may not  even be about picking up chicks and sticking your cock in it -- it's about finding What You Can Be In This World.  Defining It.  Controling It and  saying: I will take what is mine.  You just happen  to get a blow job out of it, then hey-what-the-fuck- why-not?  he.he.he.
PHIL
MIN: You wanna call him on the phone? We can call him, I can dial the  phone if you can remember the number -- 


MAX: Thank you, Chad, and good luck to you and your mother -- 
STANLEY
MIN: I think that you have to be nicer to me.


MAX: I'm fine. I'm fine, I just wanna keep playing --
DONNIE
MIN: My teeff...my teeef....


MAX: My name is Donnie Smith and I have lot's of love to give. 
EARL
MIN: No, no, the grade...the grade that you're in? 


MAX: "...it's not going to stop 'till you wise up..."
LINDA
MIN: listen...listen to me now, Phil:  I'm sorry, sorry I slapped your face.  ...because I don't know what I'm doing... ...I don't know how to do this, y'know?  You understand?  y'know?  I...I'm...I do things  and I fuck up and I fucked up....forgive me, ok? Can you just...


MAX: I'm listening.  I'm getting better. 
NARRATOR
MIN: -- added to this, Mr. Hansen's tortured life met before with Delmer Darion just two nights previous --


MAX: So Fay Barringer was charged with the  murder of her son and Sydney Barringer  noted as an accomplice in his own death...
JIM KURRING_CLAUDIA
MIN: You mind if I check things back here? 


MAX: ok. 
JIMMY_STANLEY
MIN: I don't mean to cry, I'm sorry. 


MAX: Imagine you are attending a jam session of classical composers and they have  each done an arrangment of the classic  favorite, "Whispering."  Here are three  variations on the theme, as three classic  composer's might have written it -- you are to name the composer.  The First: 
PHIL_EARL
MIN: -- it's not him. it's not him. He's the fuckin' asshole...Phil..c'mere... 


MAX: ...ah...maybe...yeah...she's a good one... 
FRANK_PHIL
MIN: When they put me on hold, to  talk to you...they play the tapes.  I mean: I'd seen the commercials and heard about you, but I'd never heard the tapes ....


MAX: I just...he was...but I gave him,  I just had to give him a small dose of  liquid morphine.  He hasn't been able to swallow the morphine pills so we now,  I just had to go to the liquid morphine... For the pain, you understand? 
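`maverage` comes from `lib.py`, so its exact definition isn't shown here; the `.99` argument suggests an exponentially weighted moving average over the TextBlob polarities. A minimal sketch of that idea (the smoothing formula is an assumption, not `lib.py`'s actual code):

```python
def ema(values, alpha=0.99):
    """Exponentially weighted moving average: alpha near 1 smooths heavily."""
    out, acc = [], values[0]
    for v in values:
        acc = alpha * acc + (1 - alpha) * v
        out.append(acc)
    return out

polarities = [0.0, 0.5, -0.5, 1.0]
smoothed = ema(polarities, alpha=0.5)  # smaller alpha reacts faster, for illustration
print(smoothed)  # [0.0, 0.25, -0.125, 0.4375]
```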
In [17]:
# Prefix scene characters with INSCENE_ so they stay distinct from the
# SPEAKING_ items added to each basket below. (The original removed and
# re-appended items while iterating; a comprehension renames them safely.)
for key in scenes:
    scenes[key]=["INSCENE_"+s for s in scenes[key]]
In [18]:
# Note: the result is not assigned, so rows with NaN dialogue remain and
# are filtered by the type check in the next cell instead.
magnolia.dropna(subset=['dialogue'])
In [19]:
baskets=[]
spchars=["\"", "'", ".", ",", "-"]   # punctuation stripped from tokens
attributes=["?", "!"]                # kept as standalone basket items
for s in magnolia.iterrows():
    # Skip NaN (float) or empty dialogues.
    if type(s[1]['dialogue'])!=float and  len(s[1]['dialogue'])>0:
        new=[]
        # Each basket starts with the scene's characters and the speaker.
        for k in scenes[s[1]['scene']]:
            new.append(k)
        new.append("SPEAKING_"+s[1]['char'])
        for k in s[1]['dialogue'].split(" "):
            ko=k
            for t in spchars:
                ko=ko.replace(t, "")
            for t in attributes:
                if ko.find(t)>=0:
                    new.append(t)
                    ko=ko.replace(t, "")
            if len(ko)>0:
                new.append(ko.lower())
        new=list(set(new))   # a basket is a set of distinct items
        baskets.append(new)
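Cell [19] builds one "basket" per dialogue: the scene's characters, the speaker, and the cleaned lowercase tokens, with `?` and `!` kept as standalone items. The token-cleaning step in isolation, using the same `spchars`/`attributes` lists:

```python
spchars = ["\"", "'", ".", ",", "-"]
attributes = ["?", "!"]

def tokens(dialogue: str):
    """Split a dialogue into cleaned lowercase tokens; keep ? and ! as items."""
    out = []
    for k in dialogue.split(" "):
        for t in spchars:
            k = k.replace(t, "")
        for t in attributes:
            if t in k:
                out.append(t)
                k = k.replace(t, "")
        if k:
            out.append(k.lower())
    return out

print(tokens("You're OK... you're gonna be ok!"))
# ['youre', 'ok', 'youre', 'gonna', 'be', '!', 'ok']
```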
In [20]:
baskets2=[]
basketslist=[]
for k in baskets:
    new=dict()
    new2=[]
    for t in k:
        if t not in stopwords:
            new[t]=1
            new2.append(t)
    baskets2.append(new)
    basketslist.append(new2)
In [21]:
baskets2=pd.DataFrame(baskets2)
from mlxtend.frequent_patterns import apriori
from mlxtend.frequent_patterns import association_rules
baskets2=baskets2.fillna(0)
baskets2.to_csv(name+'_basket.csv')
In [22]:
# Keep itemsets appearing in at least 5 baskets, then rank rules by lift.
frequent_itemsets = apriori(baskets2, min_support=5/len(baskets2), use_colnames=True)
rules = association_rules(frequent_itemsets, metric="lift", min_threshold=1)
In [23]:
# 'antecedants' is the column name in older mlxtend releases; newer
# versions spell it 'antecedents'.
rules['one_lower']=[int(alllower(i) or alllower(j)) for i, j in zip(rules['antecedants'], rules['consequents'])]
In [24]:
rules['both_lower']=[int(alllower(i) and alllower(j)) for i, j in zip(rules['antecedants'], rules['consequents'])]
In [25]:
rules.to_csv(name+'_rules.csv', index=None)
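For the rules mined in cell [22], lift compares a rule's observed co-occurrence with what independence would predict: lift(A, B) = support(A and B) / (support(A) * support(B)). A hand computation on toy baskets (not the notebook's data):

```python
baskets = [
    {"frank", "phil"}, {"frank", "phil"}, {"frank"}, {"phil"}, {"earl"},
]
n = len(baskets)

def support(*items):
    """Fraction of baskets containing all the given items."""
    return sum(all(i in b for i in items) for b in baskets) / n

# lift > 1 means the pair co-occurs more often than chance would predict.
lift = support("frank", "phil") / (support("frank") * support("phil"))
print(round(lift, 3))  # 1.111
```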

Sentiment Analysis (Movie & Character)

Score per Movie

Title: TED
Distinct Words in the original text: 1859
Sentiment scale from negative to positive: afinn
Description Score % Found Words
Between 0 (negative) and 10 (positive) 4.968215 12%
Percentage of words found per sentiment type (bing) 13.8%
sentiment Percentage
positive 55.4%
negative 44.6%
Percentage of words found per sentiment type (nrc) 19.1%
sentiment Percentage
positive 19.8%
negative 14.8%
trust 12.1%
joy 11.5%
anticipation 10.5%
fear 6.9%
anger 6.7%
sadness 6.3%
surprise 6.1%
disgust 5.3%
Percentage of words found per sentiment type (loughran) 5%
sentiment Percentage
negative 35.9%
positive 35.0%
uncertainty 24.9%
litigious 3.0%
constraining 1.3%
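The afinn score above appears to rescale the AFINN lexicon (word valences from -5 to +5) onto a 0 to 10 scale, averaged over the words found. A sketch with a tiny hand-picked stand-in lexicon (the real AFINN list has around 2,500 entries; both the valences and the rescaling are illustrative assumptions):

```python
# Tiny stand-in for the AFINN lexicon (valences -5..+5); illustrative values.
afinn = {"love": 3, "good": 3, "bad": -3, "hate": -3}

def score_0_10(words):
    """Average valence of found words, rescaled from [-5, 5] to [0, 10]."""
    found = [afinn[w] for w in words if w in afinn]
    if not found:
        return None, 0.0
    mean = sum(found) / len(found)
    return mean + 5, len(found) / len(words)  # score, fraction of words found

score, coverage = score_0_10("i love this good bad movie".split())
print(score, coverage)  # 6.0 0.5
```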

Score per Character

[1] "Sentiment Analysis of Character: TED" [1] "Total number of unique words in the text: 658"

Sentiment scale from negative to positive: afinn
Description Score % Found Words
Between 0 (negative) and 10 (positive) 4.946809 13.1%
Percentage of words found per sentiment type (bing) 11.2%
sentiment Percentage
positive 58.5%
negative 41.5%
Percentage of words found per sentiment type (nrc) 16.9%
sentiment Percentage
positive 19.0%
negative 14.7%
trust 12.7%
anticipation 12.2%
joy 11.0%
fear 7.5%
sadness 6.5%
anger 6.2%
surprise 6.0%
disgust 4.2%
Percentage of words found per sentiment type (loughran) 4.86%
sentiment Percentage
positive 40.0%
uncertainty 30.9%
negative 29.1%

[1] "Sentiment Analysis of Character: MARY" [1] "Total number of unique words in the text: 615"

Sentiment scale from negative to positive: afinn
Description Score % Found Words
Between 0 (negative) and 10 (positive) 5.877698 9.92%
Percentage of words found per sentiment type (bing) 10.4%
sentiment Percentage
positive 70.9%
negative 29.1%
Percentage of words found per sentiment type (nrc) 15.6%
sentiment Percentage
positive 22.7%
joy 15.2%
trust 12.5%
anticipation 11.2%
negative 10.7%
surprise 8.3%
sadness 5.9%
fear 5.1%
anger 4.5%
disgust 4.0%
Percentage of words found per sentiment type (loughran) 5.04%
sentiment Percentage
positive 35.3%
uncertainty 35.3%
negative 21.6%
constraining 3.9%
litigious 3.9%

[1] "Sentiment Analysis of Character: HEALY" [1] "Total number of unique words in the text: 711"

Sentiment scale from negative to positive: afinn
Description Score % Found Words
Between 0 (negative) and 10 (positive) 4.897297 11%
Percentage of words found per sentiment type (bing) 11.1%
sentiment Percentage
positive 54.75%
negative 45.25%
Percentage of words found per sentiment type (nrc) 15.6%
sentiment Percentage
positive 19.4%
negative 15.3%
trust 12.3%
joy 12.0%
anticipation 10.5%
surprise 6.6%
sadness 6.4%
anger 6.1%
disgust 5.9%
fear 5.4%
Percentage of words found per sentiment type (loughran) 3.23%
sentiment Percentage
positive 45.2%
negative 31.0%
uncertainty 23.8%

[1] "Sentiment Analysis of Character: DOM" [1] "Total number of unique words in the text: 411"

Sentiment scale from negative to positive: afinn
Description Score % Found Words
Between 0 (negative) and 10 (positive) 4.705128 13.4%
Percentage of words found per sentiment type (bing) 12.4%
sentiment Percentage
negative 50%
positive 50%
Percentage of words found per sentiment type (nrc) 14.1%
sentiment Percentage
positive 20.9%
negative 16.6%
trust 13.5%
joy 12.9%
anger 8.0%
anticipation 8.0%
fear 5.5%
surprise 5.5%
sadness 4.9%
disgust 4.3%
Percentage of words found per sentiment type (loughran) 3.89%
sentiment Percentage
negative 47.6%
positive 33.3%
litigious 9.5%
uncertainty 9.5%

[1] "Sentiment Analysis of Character: TUCKER" [1] "Total number of unique words in the text: 395"

Sentiment scale from negative to positive: afinn
Description Score % Found Words
Between 0 (negative) and 10 (positive) 4.821429 11.6%
Percentage of words found per sentiment type (bing) 11.4%
sentiment Percentage
positive 50.88%
negative 49.12%
Percentage of words found per sentiment type (nrc) 15.7%
sentiment Percentage
positive 20.7%
negative 15.9%
trust 11.0%
fear 9.8%
anger 9.1%
sadness 9.1%
joy 8.5%
anticipation 6.7%
disgust 5.5%
surprise 3.7%
Percentage of words found per sentiment type (loughran) 4.05%
sentiment Percentage
negative 55.6%
positive 22.2%
uncertainty 11.1%
constraining 5.6%
litigious 5.6%

[1] "Sentiment Analysis of Character: MAGDA" [1] "Total number of unique words in the text: 253"

Sentiment scale from negative to positive: afinn
Description Score % Found Words
Between 0 (negative) and 10 (positive) 4.263158 12.6%
Percentage of words found per sentiment type (bing) 12.6%
sentiment Percentage
negative 65.8%
positive 34.2%
Percentage of words found per sentiment type (nrc) 13.8%
sentiment Percentage
negative 21.1%
fear 12.3%
positive 12.3%
anticipation 10.5%
anger 9.6%
sadness 8.8%
disgust 7.9%
joy 7.0%
surprise 5.3%
trust 5.3%
Percentage of words found per sentiment type (loughran) 3.16%
sentiment Percentage
negative 80%
positive 20%

[1] "Sentiment Analysis of Character: SULLY" [1] "Total number of unique words in the text: 125"

Sentiment scale from negative to positive: afinn
Description Score % Found Words
Between 0 (negative) and 10 (positive) 3.8 13.6%
Percentage of words found per sentiment type (bing) 16%
sentiment Percentage
negative 58.3%
positive 41.7%
Percentage of words found per sentiment type (nrc) 12.8%
sentiment Percentage
negative 25.5%
disgust 15.7%
anger 11.8%
fear 11.8%
anticipation 7.8%
joy 5.9%
positive 5.9%
sadness 5.9%
trust 5.9%
surprise 3.9%
Percentage of words found per sentiment type (loughran) 4.8%
sentiment Percentage
negative 50.0%
positive 37.5%
uncertainty 12.5%

[1] "Sentiment Analysis of Character: MARY'S MOM" [1] "Total number of unique words in the text: 95"

Sentiment scale from negative to positive: afinn
Description Score % Found Words
Between 0 (negative) and 10 (positive) 5.6 5.26%
Percentage of words found per sentiment type (bing) 8.42%
sentiment Percentage
negative 50%
positive 50%
Percentage of words found per sentiment type (nrc) 9.47%
sentiment Percentage
positive 31.2%
joy 18.8%
anticipation 12.5%
negative 12.5%
trust 12.5%
sadness 6.2%
surprise 6.2%
Percentage of words found per sentiment type (loughran) 1.05%
sentiment Percentage
negative 100%

[1] "Sentiment Analysis of Character: MARY'S DAD" [1] "Total number of unique words in the text: 90"

Sentiment scale from negative to positive: afinn
Description Score % Found Words
Between 0 (negative) and 10 (positive) 5 6.67%
Percentage of words found per sentiment type (bing) 7.78%
sentiment Percentage
positive 62.5%
negative 37.5%
Percentage of words found per sentiment type (nrc) 8.89%
sentiment Percentage
anticipation 14.29%
negative 14.29%
positive 14.29%
sadness 14.29%
anger 9.52%
fear 9.52%
trust 9.52%
disgust 4.76%
joy 4.76%
surprise 4.76%
Percentage of words found per sentiment type (loughran) 1.11%
sentiment Percentage
negative 100%

[1] "Sentiment Analysis of Character: WARREN" [1] "Total number of unique words in the text: 26"

Sentiment scale from negative to positive: afinn
Description Score % Found Words
Between 0 (negative) and 10 (positive) 5.5 7.69%
Percentage of words found per sentiment type (bing) 7.69%
sentiment Percentage
negative 50%
positive 50%
Percentage of words found per sentiment type (nrc) 11.5%
sentiment Percentage
anticipation 25.0%
anger 12.5%
joy 12.5%
negative 12.5%
positive 12.5%
surprise 12.5%
trust 12.5%
Percentage of words found per sentiment type (loughran) 3.85%
sentiment Percentage
positive 100%

[1] "Sentiment Analysis of Character: FRIEND #1" [1] "Total number of unique words in the text: 79"

Sentiment scale from negative to positive: afinn
Description Score % Found Words
Between 0 (negative) and 10 (positive) 3.818182 11.4%
Percentage of words found per sentiment type (bing) 11.4%
sentiment Percentage
negative 63.6%
positive 36.4%
Percentage of words found per sentiment type (nrc) 16.5%
sentiment Percentage
positive 32%
anticipation 16%
joy 12%
negative 12%
trust 12%
disgust 8%
anger 4%
fear 4%
Percentage of words found per sentiment type (loughran) 0%
sentiment Percentage

[1] "Sentiment Analysis of Character: HITCHHIKER" [1] "Total number of unique words in the text: 121"

Sentiment scale from negative to positive: afinn
Description Score % Found Words
Between 0 (negative) and 10 (positive) 6.285714 5.79%
Percentage of words found per sentiment type (bing) 7.44%
sentiment Percentage
positive 80%
negative 20%
Percentage of words found per sentiment type (nrc) 8.26%
sentiment Percentage
positive 34.6%
trust 23.1%
anticipation 15.4%
joy 15.4%
surprise 7.7%
anger 3.8%
Percentage of words found per sentiment type (loughran) 4.13%
sentiment Percentage
positive 40%
litigious 20%
negative 20%
uncertainty 20%

Score per Character over Time

Top 10 Characters

Peak Dialogues for the Top 10 Characters: aboutmary
Character Min_Max Dialogue
TED MIN You’re Woogie?
TED MAX No, it’s an old football injury.
MARY MIN But I think I’d be happiest…with you.
MARY MAX That’s right. And the good thing is you can do it anywhere.
HEALY MIN He’s no friend of mine.
HEALY MAX Fine. Fine.
DOM MIN Dom Wooganowski. Duh.
DOM MAX Hey, look on the bright side–
TUCKER MIN I’ve got a friend in the Boston police department. He faxed me this this morning. I’ll just give you the highlights. After a short stint as a petty thief, Patrick R. Healy graduated to armed robbery by the age of fourteen. At sixteen he committed his first murder–a pretty teacher’s aid named Molly Pettygrove. He was incarcerated until age twenty-two when, despite a grim psychological profile, the state was forced to release him. In his mid-twenties and again in his early thirties he was suspected of homicides in the states of Utah and Washington. Unfortunately, the bodies were so badly decomposed that there wasn’t enough evidence to hold him, and on and on and so forth and so on.
TUCKER MAX You heard me, goddamnit. I…I love her.
MAGDA MIN That’s because there’s a lot of bad people out there. Hey, Puffy tried to warn you about that Steve guy you was seeing–he was a fucking asswipe–but you had to find out for yourself, didn’t you?
MAGDA MAX Knock it off, Pollyanna, just ’cause you’re in love doesn’t mean everyone else has to be.
SULLY MIN That’s bullshit, man you, uh, you were on the front line. Remember the, uh, malaria the, uh, typhoon fever that vicious strain of genital herpes?
SULLY MAX Fuckin’ Patrick Healy, you think your shit don’t stink. Well I got news for you–you’re goddamn right it don’t! How the hell are ya?!
MARY’S MOM MIN Charlie, do something.
MARY’S MOM MAX Charlie, that’s mean. Come on in, Ted. Don’t listen to Mr. Wise Guy here. He’s a joke a minute.
MARY’S DAD MIN What the hell are you doing?!
MARY’S DAD MAX How the hell’d you get the beans all the way up top like that?
WARREN MIN Huh?
WARREN MAX Good, Ted. Piggy back ride?

Top 4 Pairs

Peak Dialogues for the Top 4 Pairs: aboutmary
Pairs Min_Max Dialogue
MARY_HEALY MIN Oh. So…what brings you down here?
MARY_HEALY MAX We just got here thirty seconds ago. Isn’t this stuff great?
TED_MARY MIN Hi, Ted.
TED_MARY MAX Ted, are you okay?
HEALY_TUCKER MIN Dom, you’re pathetic, fucking over your friend Ted like that.
HEALY_TUCKER MAX Look, you asked me to follow your girl around, and I did and I started to like her, and then I realized I just couldn’t in good conscience do it.
TED_DOM MIN Dom Wooganowski. Duh.
TED_DOM MAX Maybe you’re right. I should look on the bright side. I mean, I’ve still got my health… I’m out of here. I’ve got to get up at six a.m. to move my boss’s brother into his apartment.

Association Rules between Words (Market Basket)

The Whole Movie

## [1] "Average Lift of the Association Rules: 20.3813707550261"
## [1] "Standard Deviation of the Lift of the Association Rules: 13.3145326352053"
## [1] "Deciles of the Lift: "
##        10%        20%        30%        40%        50%        60% 
##   1.832827   4.855072  10.806452  16.209677  23.928571  23.928571 
##        70%        80%        90%       100% 
##  29.558824  29.558824  35.892857 167.500000
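The deciles above come from R's `quantile`; the equivalent in this notebook's Python stack is `numpy.percentile` (the lift values below are made up, not the report's rule set):

```python
import numpy as np

# Hypothetical lift values standing in for the mined rules.
lifts = np.array([1.8, 4.9, 10.8, 16.2, 23.9, 23.9, 29.6, 29.6, 35.9, 167.5])
deciles = np.percentile(lifts, range(10, 101, 10))  # 10%, 20%, ..., 100%
print(np.round(deciles, 2))
```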

Histogram Data: Lift, Movie: TED
Number of Dialogues Lift Min Lift Max
44,544 -3 3
41,594 3 9
18,650 9 14
28,586 14 20
71,190 20 26
64,044 26 32
## [1] "Average Leverage of the Association Rules: 0.00810205979613019"
## [1] "Standard Deviation of the Leverage of the Association Rules: 0.00700157827387452"
## [1] "Deciles of the Leverage: "
##         10%         20%         30%         40%         50%         60% 
## 0.003588030 0.004767209 0.004806812 0.005584020 0.005828569 0.006674092 
##         70%         80%         90%        100% 
## 0.007627534 0.010487859 0.012497710 0.104923145

Histogram Data: Leverage, Movie: TED
Number of Dialogues Leverage Min Leverage Max
7,148 -0.0018 0.0018
112,936 0.0018 0.0054
121,294 0.0054 0.009
39,152 0.009 0.013
4,612 0.013 0.016
4,692 0.016 0.02

Top 10 Characters

Top 4 Pairs

Character Relationship Analysis (Pagerank)

Pagerank: About Mary.